
    Reconstructing vectorised photographic images

    We address the problem of representing captured images in the continuous mathematical space more usually associated with certain forms of drawn ('vector') images. Such an image is resolution-independent and so can serve as a master for varying resolution-specific formats. We briefly describe the main features of a vectorising codec for photographic images, whose significance is that drawing programs can access images and image components as first-class vector objects. This paper focuses on the problem of rendering from the isochromic contour form of a vectorised image and demonstrates a new fill algorithm which could also be used in drawing generally. The fill method is described in terms of level set diffusion equations for clarity. Finally, we show that image warping is both simplified and enhanced in this form and that we can demonstrate real histogram equalisation with genuinely rectangular histograms.
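    As an aside on the histogram equalisation result mentioned above: in the familiar discrete raster setting, equalisation only approximately flattens the histogram, which is the limitation a continuous vectorised representation is claimed to overcome. A minimal sketch of the classical discrete method (standard textbook technique, not the paper's vector-space algorithm) is:

    import numpy as np

    def equalise_histogram(img, levels=256):
        # Classical discrete histogram equalisation for a greyscale image.
        # img: 2-D array of integer grey levels in [0, levels).
        hist = np.bincount(img.ravel(), minlength=levels)
        cdf = np.cumsum(hist) / img.size                  # empirical CDF
        # Map each grey level to its CDF value, rescaled to the output range.
        lut = np.round(cdf * (levels - 1)).astype(img.dtype)
        return lut[img]

    # Example: equalise a synthetic low-contrast image.
    rng = np.random.default_rng(0)
    img = rng.integers(100, 156, size=(64, 64), dtype=np.uint8)
    out = equalise_histogram(img)

    Because the discrete mapping shifts and merges whole histogram bins, the output histogram is only approximately rectangular; the continuous, vectorised form described in the abstract is what allows a genuinely rectangular result.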

    Global patterns in the divergence between phylogenetic diversity and species richness in terrestrial birds

    Aim: The conservation value of sites is often based on species richness (SR). However, metrics of phylogenetic diversity (PD) reflect a community's evolutionary potential and reveal the potential for additional conservation value above that based purely on SR. Although PD is typically correlated with SR, localized differences in this relationship have been found in different taxa. Here, we explore geographical variation in global avian PD. We identify where PD is higher or lower than expected (from SR) and explore correlates of those differences, to find communities with high irreplaceability in terms of the uniqueness of evolutionary histories.

    Location: Global terrestrial.

    Methods: Using comprehensive avian phylogenies and global distributional data for all extant birds, we calculated SR and Faith's PD, a widely applied measure of community PD, across the terrestrial world. We modelled the relationship between avian PD for terrestrial birds and its potential environmental correlates. Analyses were conducted at a global scale and also for individual biogeographical realms. Potential explanatory variables of PD included SR, long-term climate stability, climatic diversity (using altitudinal range as a proxy), habitat diversity and proximity to neighbouring realms.

    Results: We identified areas of high and low relative PD (rPD; PD relative to that expected given SR). Areas of high rPD were associated with deserts and islands, while areas of low rPD were associated with historical glaciation. Our results suggest that rPD is correlated with different environmental variables in different parts of the world.

    Main conclusions: There is geographical variation in avian rPD, much of which can be explained by putative drivers. However, the importance of these drivers shows pronounced regional variation. Moreover, the variation in avian rPD differs substantially from patterns found for mammals and amphibians. We suggest that PD adds additional insights about the irreplaceability of communities to conventional metrics of biodiversity based on SR, and could be usefully included in assessments of site valuation and prioritization.
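    For reference, Faith's PD (the community PD measure used above) is the sum of the branch lengths of the minimal subtree of the phylogeny spanning the community; rPD then compares observed PD with the PD expected given SR. A toy sketch of the PD computation, on a hypothetical four-taxon phylogeny stored as parent pointers with branch lengths, might look like this:

    # Toy illustration of Faith's phylogenetic diversity (PD): the sum of branch
    # lengths of the minimal subtree connecting a community's species to the root.
    # The phylogeny below is hypothetical, purely for illustration.
    tree = {  # node: (parent, length of the branch to that parent)
        "sparrow":    ("passerines", 2.0),
        "crow":       ("passerines", 2.5),
        "passerines": ("root", 4.0),
        "ostrich":    ("root", 9.0),
    }

    def faith_pd(tree, community):
        # Sum branch lengths over the union of root-paths of the community.
        counted, pd = set(), 0.0
        for species in community:
            node = species
            while node in tree and node not in counted:
                parent, length = tree[node]
                pd += length
                counted.add(node)
                node = parent
        return pd

    print(faith_pd(tree, {"sparrow", "crow"}))     # 2.0 + 2.5 + 4.0 = 8.5
    print(faith_pd(tree, {"sparrow", "ostrich"}))  # 2.0 + 4.0 + 9.0 = 15.0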

    Efficacy of pimobendan in the prevention of congestive heart failure or sudden death in Doberman Pinschers with preclinical dilated cardiomyopathy (the PROTECT study)

    Background: The benefit of pimobendan in delaying the progression of preclinical dilated cardiomyopathy (DCM) in Dobermans is not reported.

    Hypothesis: That chronic oral administration of pimobendan to Dobermans with preclinical DCM will delay the onset of congestive heart failure (CHF) or sudden death and improve survival.

    Animals: Seventy-six client-owned Dobermans recruited at 10 centers in the UK and North America.

    Methods: The trial was a randomized, blinded, placebo-controlled, parallel-group, multicenter study. Dogs were allocated in a 1:1 ratio to receive pimobendan (Vetmedin capsules) or a visually identical placebo. The composite primary endpoint was prospectively defined as either onset of CHF or sudden death. Time to death from all causes was a secondary endpoint.

    Results: The proportion of dogs reaching the primary endpoint was not significantly different between groups (P = .1). The median time to the primary endpoint (onset of CHF or sudden death) was significantly longer in the pimobendan group (718 days, IQR 441–1152 days) than in the placebo group (441 days, IQR 151–641 days) (log-rank P = .0088). The median survival time was significantly longer in the pimobendan group (623 days, IQR 491–1531 days) than in the placebo group (466 days, IQR 236–710 days) (log-rank P = .034).

    Conclusion and Clinical Importance: The administration of pimobendan to Dobermans with preclinical DCM prolongs the time to the onset of clinical signs and extends survival. Treatment of dogs in the preclinical phase of this common cardiovascular disorder with pimobendan can lead to an improved outcome.
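    The endpoint comparisons above are time-to-event analyses (median times to event compared with a log-rank test). A minimal sketch of that style of analysis in Python, using the lifelines package and invented numbers rather than the study's data, is:

    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Days to the composite endpoint; event=1 means the endpoint was observed,
    # event=0 means the dog was censored. All values are invented for illustration.
    pimo_days, pimo_event = [718, 650, 900, 1100, 400], [1, 1, 0, 1, 1]
    plac_days, plac_event = [441, 300, 500, 200, 600],  [1, 1, 1, 1, 0]

    km = KaplanMeierFitter()
    km.fit(pimo_days, event_observed=pimo_event, label="pimobendan")
    print(km.median_survival_time_)          # Kaplan-Meier median time to endpoint

    result = logrank_test(pimo_days, plac_days,
                          event_observed_A=pimo_event,
                          event_observed_B=plac_event)
    print(result.p_value)                    # log-rank P value for the comparison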

    The shock compression of microorganism-loaded broths and emulsions: Experiments and simulations

    By carefully selecting the flyer-plate thickness and the geometry of a target capsule containing bacterial broths and emulsions, we have successfully subjected the capsule contents to simultaneous shock and dynamic compression in a flyer-plate impact experiment. The capsules were designed to be recovered intact so that post-experiment analysis could be performed on the contents. ANSYS® AUTODYN hydrocode simulations were carried out to interrogate the deformation of the cover plate and the wave propagation in the fluid. We have shown that microorganisms such as Escherichia coli, Enterococcus faecalis and Zygosaccharomyces bailii are not affected by this type of loading regime. However, by introducing a cavity behind the broth we were able to observe a limited kill in the yeast sample. Further, using this latter technique with emulsions, we showed that greater emulsification of an oil-based emulsion occurred due to the cavitation that was introduced.

    Survey of Sensor Technology for Aircraft Cabin Environment Sensing

    The aircraft cabin environment is unique due to the proximity of the passengers, the need for cabin pressurization, and the low humidity, all complicated by the fact that the aircraft is a semi-enclosed structure. There is an increasing desire to monitor the aircraft cabin environment with various sensors for comfort and safety. However, the aircraft cabin environment is composed of a large number of factors, including air quality, temperature, level of pressurization, and motion of the aircraft. Therefore, many types of sensors must be used to monitor aircraft environments. A variety of technology options are often available for each sensor. Consequently, a number of tradeoffs need to be carefully considered when designing a sensor monitoring system for the aircraft cabin environment. For instance, a system designer may need to decide whether the increased accuracy of a sensor using a particular technology is worth the increased power consumption over a similar sensor employing a more efficient, less accurate technology. To achieve a good solution, a designer needs to understand the tradeoffs and general operation of all the different sensor technologies that could be used in the design. The purpose of this paper is to provide a survey of current sensor technology, focusing on sensors and technologies that cover the most common aspects of aircraft cabin environment monitoring. The first half of this paper details the basic operation of different sensor technologies. The second half covers the individual environmental conditions which need to be sensed, including the benefits, limitations, and applications of the different technologies available for each particular type of sensor.

    Computer modeling of diabetes and its transparency: a report on the Eighth Mount Hood Challenge

    Objectives: The Eighth Mount Hood Challenge (held in St. Gallen, Switzerland, in September 2016) evaluated the transparency of model input documentation from two published health economics studies and developed guidelines for improving transparency in the reporting of input data underlying model-based economic analyses in diabetes.

    Methods: Participating modeling groups were asked to reproduce the results of two published studies using the input data described in those articles. Gaps in input data were filled with assumptions reported by the modeling groups. Goodness of fit between the results reported in the target studies and the groups' replicated outputs was evaluated using the slope of the linear regression line and the coefficient of determination (R²). After a general discussion of the results, a diabetes-specific checklist for the transparency of model input was developed.

    Results: Seven groups participated in the transparency challenge. The reporting of key model input parameters in the two studies, including the baseline characteristics of simulated patients, treatment effect and treatment intensification threshold assumptions, treatment effect evolution, prediction of complications, and costs data, was inadequately transparent (and often missing altogether). Not surprisingly, goodness of fit was better for the study that reported its input data with more transparency. To improve transparency in diabetes modeling, the Diabetes Modeling Input Checklist, listing the minimal input data required for reproducibility in most diabetes modeling applications, was developed.

    Conclusions: Transparency of diabetes model inputs is important to the reproducibility and credibility of simulation results. In the Eighth Mount Hood Challenge, the Diabetes Modeling Input Checklist was developed with the goal of improving the transparency of input data reporting and the reproducibility of diabetes simulation model results.
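    The goodness-of-fit criterion described above (regression slope and R² between the target studies' reported results and each group's replicated outputs) can be computed as in the following small sketch; the numbers are placeholders, not challenge data.

    # Sketch of the goodness-of-fit measure: regress replicated outputs on the
    # target study's reported results and inspect the slope and R^2.
    # The values below are placeholders, not data from the challenge.
    from scipy.stats import linregress

    reported   = [10.2, 11.5, 9.8, 12.0, 10.9]   # outcomes reported in the target study
    replicated = [10.0, 11.9, 9.5, 12.3, 11.1]   # one group's replicated outcomes

    fit = linregress(reported, replicated)
    slope = fit.slope              # ideally close to 1
    r_squared = fit.rvalue ** 2    # coefficient of determination, ideally close to 1
    print(slope, r_squared)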

    Multiscale formulation for material failure accounting for cohesive cracks at the macro and micro scales

    This contribution presents a two-scale formulation devised to simulate failure in materials with heterogeneous micro-structure. The mechanical model accounts for the activation of cohesive cracks in the micro-scale domain. The evolution/propagation of cohesive micro-cracks can induce material instability at the macro-scale level. A cohesive crack is then activated in the macro-scale model which considers, in a homogenized sense, the constitutive response of the intricate failure mode taking place at the smaller length scale.

    The two-scale model is based on the concept of a Representative Volume Element (RVE) and is designed following an axiomatic variational structure. Two hypotheses are introduced to build the foundations of the entire two-scale theory, namely: (i) a mechanism for transferring kinematical information from the macro- to the micro-scale, along with the concept of "Kinematical Admissibility" relating both primal descriptions, and (ii) a Multiscale Variational Principle of internal virtual power equivalence between the involved scales of analysis. The homogenization formulae for the generalized stresses, as well as the equilibrium equations at the micro-scale, are consequences of the variational statement of the problem.

    The present multiscale technique is a generalization of a previous model proposed by the authors and can be viewed as an application of a general framework recently proposed by the authors. The main novelty in this article lies in the fact that failure modes in the micro-structure now involve a set of multiple cohesive cracks, connected or disconnected, with arbitrary orientation, forming a complex tortuous failure path. Tortuosity is a topic of decisive importance in the modelling of material degradation due to crack propagation. Following the present multiscale modelling approach, the tortuosity effect is introduced in order to satisfy the "Kinematical Admissibility" concept when the macro-scale kinematics is transferred to the micro-scale domain. It therefore has a direct consequence on the homogenized mechanical response, in the sense that the proposed scale-transition method (including the tortuosity effect) retrieves the correct post-critical response.

    Coupled (macro-micro) numerical examples are presented showing the potential of the model to simulate complex and realistic fracture problems in heterogeneous materials. In order to validate the multiscale technique in a rigorous manner, comparisons with the so-called DNS (Direct Numerical Solution) approach are also presented.
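    The "internal virtual power equivalence" hypothesis invoked above is usually written as a Hill-Mandel-type macro-homogeneity condition. A generic statement over an RVE occupying \Omega_\mu (a standard form, not necessarily the authors' exact formulation, which additionally accounts for the cohesive-crack kinematics) is

    \boldsymbol{\sigma} : \delta\boldsymbol{\varepsilon} \;=\; \frac{1}{|\Omega_\mu|} \int_{\Omega_\mu} \boldsymbol{\sigma}_\mu : \delta\boldsymbol{\varepsilon}_\mu \, \mathrm{d}\Omega \qquad \text{for all kinematically admissible } \delta\boldsymbol{\varepsilon}_\mu,

    i.e. the macro-scale internal virtual power equals the volume average of the micro-scale internal virtual power; the homogenized stress and the micro-scale equilibrium equations then follow from this variational statement.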

    System Size and Energy Dependence of Jet-Induced Hadron Pair Correlation Shapes in Cu+Cu and Au+Au Collisions at sqrt(s_NN) = 200 and 62.4 GeV

    We present azimuthal angle correlations of intermediate transverse momentum (1-4 GeV/c) hadrons from dijets in Cu+Cu and Au+Au collisions at sqrt(s_NN) = 62.4 and 200 GeV. The away-side dijet-induced azimuthal correlation is broadened, non-Gaussian, and peaked away from \Delta\phi=\pi in central and semi-central collisions in all the systems. The broadening and peak location are found to depend upon the number of participants in the collision, but not on the collision energy or beam nuclei. These results are consistent with sound or shock wave models, but pose challenges to Cherenkov gluon radiation models.

    Comment: 464 authors from 60 institutions, 6 pages, 3 figures, 2 tables. Submitted to Physical Review Letters. Plain text data tables for the points plotted in figures for this and previous PHENIX publications are (or will be) publicly available at http://www.phenix.bnl.gov/papers.htm
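    For context, such analyses are conventionally expressed through the per-trigger pair yield in relative azimuth (a standard definition in the field, not quoted from this paper),

    Y(\Delta\phi) \;=\; \frac{1}{N_{\mathrm{trig}}} \frac{dN_{\mathrm{pair}}}{d\Delta\phi}, \qquad \Delta\phi = \phi_{\mathrm{assoc}} - \phi_{\mathrm{trig}},

    where the away-side region is \Delta\phi \approx \pi; the observation reported above is that this away-side peak is displaced from \pi in central and semi-central collisions.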

    Improved Measurement of Double Helicity Asymmetry in Inclusive Midrapidity pi^0 Production for Polarized p+p Collisions at sqrt(s)=200 GeV

    We present an improved measurement of the double helicity asymmetry for pi^0 production in polarized proton-proton scattering at sqrt(s) = 200 GeV employing the PHENIX detector at the Relativistic Heavy Ion Collider (RHIC). The improvements over our previous measurement come from two main factors: the inclusion of a new data set from the 2004 RHIC run with higher beam polarizations than the earlier run, and a recalibration of the beam polarization measurements, which resulted in reduced uncertainties and increased beam polarizations. The results are compared to a Next-to-Leading-Order (NLO) perturbative Quantum Chromodynamics (pQCD) calculation with a range of polarized gluon distributions.

    Comment: 389 authors, 4 pages, 2 tables, 1 figure. Submitted to Phys. Rev. D, Rapid Communications. Plain text data tables for the points plotted in figures for this and previous PHENIX publications are (or will be) publicly available at http://www.phenix.bnl.gov/papers.htm
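    The measured observable is the standard double helicity asymmetry (definition standard in the field, not quoted from this paper),

    A_{LL} \;=\; \frac{\sigma_{++} - \sigma_{+-}}{\sigma_{++} + \sigma_{+-}},

    where ++ and +- label same- and opposite-helicity combinations of the colliding proton beams. Experimentally it is extracted from spin-sorted yields scaled by the inverse product of the beam polarizations, which is why the recalibrated, higher beam polarizations directly reduce the uncertainty on the measurement.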